Robust variable selection and estimation via adaptive elastic net S-estimators for linear regression

Authors

Abstract

Heavy-tailed error distributions and predictors with anomalous values are ubiquitous in high-dimensional regression problems and can seriously jeopardize the validity of statistical analyses if not properly addressed. For more reliable variable selection and prediction under these adverse conditions, adaptive PENSE, a new robust regularized estimator, is proposed. Adaptive PENSE yields reliable coefficient estimates even in the presence of aberrant contamination in the predictors or in the residuals. It is shown that the adaptive penalty leads to better variable selection than other penalties, particularly in the presence of gross outliers in the predictor space. It is further demonstrated that the estimator has strong robustness properties and possesses the oracle property under heavy-tailed errors, without the need to estimate the residual scale. Numerical studies on simulated and real data sets highlight its superior finite-sample performance in a vast range of settings compared to competing estimators in the case of contaminated samples. An R package implementing a fast algorithm for computing the proposed method and additional simulation results are provided in the supplementary materials.
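
A minimal sketch of the form such an estimator takes (the notation below is our own illustration, not reproduced from the paper): adaptive PENSE minimizes a robust M-scale of the residuals plus an adaptively weighted elastic net penalty,

\[
\hat{\beta} \in \arg\min_{\beta}\;
\hat{\sigma}_M\!\left(y - X\beta\right)^2
+ \lambda \sum_{j=1}^{p} \hat{w}_j \Bigl( \alpha\,\lvert\beta_j\rvert + \tfrac{1-\alpha}{2}\,\beta_j^{2} \Bigr),
\qquad
\hat{w}_j = \lvert\tilde{\beta}_j\rvert^{-\zeta},
\]

where \(\hat{\sigma}_M(\cdot)\) is an M-scale of the residuals (the S-estimation part, which avoids a separate estimate of the error scale), \(\tilde{\beta}\) is a preliminary robust fit (e.g., a non-adaptive PENSE estimate) supplying the penalty weights, and \(\alpha\), \(\lambda\), \(\zeta\) are tuning parameters.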


Similar Articles

Robust Elastic Net Regression

We propose a robust elastic net (REN) model for high-dimensional sparse regression and give its performance guarantees (both the statistical error bound and the optimization bound). A simple idea of trimming the inner product is applied to the elastic net model. Specifically, we robustify the covariance matrix by trimming the inner product based on the intuition that the trimmed inner product c...
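
As an illustration of the trimming idea (the exact rule used by REN may differ; this is only a sketch), a trimmed inner product discards the largest-magnitude summands so that a few grossly corrupted entries cannot dominate the Gram-matrix terms:

\[
\langle a, b \rangle_{h} \;=\; \sum_{i \in \mathcal{I}_h} a_i b_i,
\qquad
\mathcal{I}_h = \{\text{indices of the } h \text{ smallest } \lvert a_i b_i \rvert\},
\]

with \(h < n\), so the \(n - h\) most extreme products are removed before building the robustified covariance matrix.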


Regularization and variable selection via the elastic net

We propose the elastic net, a new regularization and variable selection method. Real world data and a simulation study show that the elastic net often outperforms the lasso, while enjoying a similar sparsity of representation. In addition, the elastic net encourages a grouping effect, where strongly correlated predictors tend to be in or out of the model together. The elastic net is particularly...
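
For reference, the (naive) elastic net criterion combines the lasso and ridge penalties:

\[
\hat{\beta}^{\mathrm{en}} = \arg\min_{\beta}\;
\lVert y - X\beta \rVert_2^{2}
+ \lambda_1 \lVert \beta \rVert_1
+ \lambda_2 \lVert \beta \rVert_2^{2},
\]

where the \(\ell_1\) term induces sparsity and the \(\ell_2\) term stabilizes the solution for strongly correlated predictors, which produces the grouping effect described above.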


Robust Estimation in Linear Regression with Multicollinearity and Sparse Models

One of the factors affecting the statistical analysis of the data is the presence of outliers. The methods which are not affected by the outliers are called robust methods. Robust regression methods are robust estimation methods of regression model parameters in the presence of outliers. Besides outliers, the linear dependency of regressor variables, which is called multicollinearity...


Robust variable selection for mixture linear regression models

In this paper, we propose a robust variable selection procedure to estimate and select relevant covariates for finite mixtures of linear regression models by assuming that the error terms follow a Laplace distribution, after trimming the high-leverage points. We introduce a revised Expectation-Maximization (EM) algorithm for numerical computation. Simulation studies indicate that the propos...
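
To make the robustness mechanism concrete (a generic illustration, not necessarily the paper's exact formulation): with Laplace errors the log-density of a residual is proportional to its absolute value,

\[
f(\epsilon \mid \sigma) = \frac{1}{2\sigma}\exp\!\Bigl(-\frac{\lvert \epsilon \rvert}{\sigma}\Bigr)
\;\Longrightarrow\;
\log f\bigl(y_i - x_i^{\top}\beta_k \mid \sigma_k\bigr)
= -\frac{\lvert y_i - x_i^{\top}\beta_k \rvert}{\sigma_k} - \log(2\sigma_k),
\]

so the M-step of the EM algorithm reduces to weighted least-absolute-deviation fits within each mixture component, which downweights large residuals relative to least squares.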


Variable selection in linear regression through adaptive penalty selection

Model selection procedures often use a fixed penalty, such as Mallows’ Cp, to avoid choosing a model which fits a particular data set extremely well. These procedures are often devised to give an unbiased risk estimate when a particular chosen model is used to predict future responses. As a correction for not including the variability induced in model selection, generalized degrees of freedom i...
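
As a concrete example of such a fixed penalty, Mallows' Cp charges each fitted coefficient a fixed cost of 2:

\[
C_p = \frac{\mathrm{RSS}_p}{\hat{\sigma}^{2}} - n + 2p,
\]

where \(\mathrm{RSS}_p\) is the residual sum of squares of a submodel with \(p\) coefficients and \(\hat{\sigma}^{2}\) is the error-variance estimate from the full model. The penalty of 2 per coefficient is the same regardless of how many candidate models were searched, which is precisely the bias that generalized degrees of freedom are meant to correct.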



Journal

Journal title: Computational Statistics & Data Analysis

Year: 2023

ISSN: 0167-9473, 1872-7352

DOI: https://doi.org/10.1016/j.csda.2023.107730